Optimized routing algorithm based on cooperative communication of cluster parent set for low power and lossy network
YAO Yukun, LIU Jiangbing, LI Xiaoyong
Journal of Computer Applications    2017, 37 (5): 1300-1305.   DOI: 10.11772/j.issn.1001-9081.2017.05.1300
To deal with the problem that the routing algorithm based on cooperative communication of cluster parents (CRPL) for the Low Power and Lossy Network (LLN) cannot balance node energy consumption or effectively maximize the network lifetime, because it takes no account of the residual energy of nodes, a high-efficiency routing algorithm based on cooperative communication of the cluster parent set (HE-CRPL) was proposed. The proposed algorithm carried out three main optimizations. Firstly, both the wireless link quality and the residual energy of nodes were considered during cluster parent selection. Secondly, the wireless link quality and the Expected LifeTime (ELT) of cluster parent nodes were combined when estimating the priority of cluster parent nodes and selecting the optimal cluster parent set. Thirdly, the cluster parent nodes were notified of the priority list by Destination Advertisement Object (DAO) messages during the initialization of the network topology. The simulation results show that, compared with the CRPL algorithm, the HE-CRPL algorithm is obviously better at prolonging the network lifetime, increasing the packet delivery success rate and reducing the number of packet retransmissions; the network lifetime is prolonged by more than 18.7% and the number of retransmissions is reduced by more than 15.9%.
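As a rough illustration of the cluster-parent ranking idea in the abstract above, the sketch below scores candidate parents by combining an ETX-like link-quality metric with an Expected LifeTime (ELT) estimate derived from residual energy; the field names, weights and scoring formula are assumptions for illustration, not the paper's definitions.

```python
# Hypothetical sketch of ranking cluster parents by link quality and ELT.
from dataclasses import dataclass

@dataclass
class ClusterParent:
    node_id: int
    etx: float              # expected transmission count on the link (>= 1, lower is better)
    residual_energy: float  # joules left in the battery
    drain_rate: float       # joules consumed per second while forwarding

    def expected_lifetime(self) -> float:
        """ELT estimate: how long this parent can keep forwarding at its current load."""
        return self.residual_energy / self.drain_rate if self.drain_rate > 0 else float("inf")

def select_cluster_parent_set(candidates, w_link=0.5, w_elt=0.5, k=3):
    """Rank candidates by a weighted score and return the k highest-priority parents."""
    best_etx = min(c.etx for c in candidates)
    max_elt = max(c.expected_lifetime() for c in candidates)
    def score(c: ClusterParent) -> float:
        return w_link * (best_etx / c.etx) + w_elt * (c.expected_lifetime() / max_elt)
    return sorted(candidates, key=score, reverse=True)[:k]

parents = [ClusterParent(1, etx=1.2, residual_energy=5.0, drain_rate=0.02),
           ClusterParent(2, etx=1.0, residual_energy=1.0, drain_rate=0.02),
           ClusterParent(3, etx=2.5, residual_energy=8.0, drain_rate=0.02)]
print([p.node_id for p in select_cluster_parent_set(parents, k=2)])  # node 2 loses: good link, low energy
```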
High efficiency medium access control protocol based on cooperative network coding
YAO Yukun, LI Xiaoyong, REN Zhi, LIU Jiangbing
Journal of Computer Applications    2017, 37 (10): 2748-2753.   DOI: 10.11772/j.issn.1001-9081.2017.10.2748
The transmission energy consumption of nodes is not considered in the existing Network Coding Aware Cooperative MAC (NCAC-MAC) protocol for Ad Hoc networks, and the control message sent by a candidate cooperative relay node cannot make the other candidate nodes outside its communication range give up the competition, which causes collisions. To deal with these problems, a High Efficiency Medium Access Control (MAC) protocol based on Cooperative Network Coding (HECNC-MAC) was proposed. Three optimization schemes were carried out by the protocol. Firstly, a candidate cooperative relay node prejudges whether the destination node can decode the packet, so as to reduce the number of competing relay nodes and ensure that the destination node can decode successfully. Secondly, the transmission energy consumption of nodes is comprehensively considered when selecting the cooperative relay node. Finally, the Eager To Help (ETH) message is canceled, and the destination node sends a confirmation message through pseudo-broadcast. Theoretical analysis and simulation results show that, compared with the Carrier Sense Multiple Access (CSMA), Phoenix and NCAC-MAC protocols, HECNC-MAC effectively reduces the transmission energy consumption of nodes and the end-to-end delay of data packets while improving the network throughput.
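The sketch below illustrates, under assumed data structures, the two relay-side checks described above: prejudging whether the destination can decode the coded packet (for XOR coding this means it already holds all but one of the mixed native packets) and preferring the eligible relay with the lowest transmission energy. It is not the HECNC-MAC protocol itself.

```python
# Illustrative relay-side checks: decodability prejudgment, then energy-based selection.
def destination_can_decode(coded_packet_ids, packets_overheard_by_dest):
    """With XOR coding, the destination can decode a coded packet only if it
    already holds all but (at most) one of the native packets mixed into it."""
    missing = [p for p in coded_packet_ids if p not in packets_overheard_by_dest]
    return len(missing) <= 1

def select_relay(candidates):
    """Pick the eligible candidate with the lowest estimated transmission energy."""
    eligible = [c for c in candidates if c["can_decode_at_dest"]]
    return min(eligible, key=lambda c: c["tx_energy"], default=None)

candidates = [
    {"id": "A", "can_decode_at_dest": destination_can_decode({1, 2}, {1}), "tx_energy": 0.8},
    {"id": "B", "can_decode_at_dest": destination_can_decode({1, 2}, set()), "tx_energy": 0.5},
]
print(select_relay(candidates))  # "A" wins: "B" is cheaper but its packet would be undecodable
```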
Collaborative filtering recommendation based on entropy and timeliness
LIU Jiangdong, LIANG Gang, FENG Cheng, ZHOU Hongyu
Journal of Computer Applications    2016, 36 (9): 2531-2534.   DOI: 10.11772/j.issn.1001-9081.2016.09.2531
Aiming at the noisy data problem in collaborative filtering recommendation, a user entropy model was put forward. The user entropy model combined the concept of entropy from information theory and used information entropy to measure the amount of information carried by a user; it filtered out noisy data by calculating the entropy of each user and removing the users with low entropy. Meanwhile, the user entropy model was combined with an item timeliness model, which derived the timeliness of each item from the contextual information of the rating data, thereby alleviating the data sparsity problem in collaborative filtering. The experimental results show that the proposed algorithm can effectively filter out noisy data and improve recommendation accuracy; its recommendation precision is increased by about 1.1% compared with the baseline algorithm.
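A minimal sketch of the user entropy filter described above: each user's rating distribution is reduced to a Shannon entropy value, and users below a threshold are dropped as noise. The threshold value and rating scale are illustrative assumptions.

```python
# Shannon-entropy filter over user rating distributions (threshold is an assumption).
import math
from collections import Counter

def user_entropy(ratings):
    """Shannon entropy (bits) of a user's rating distribution."""
    counts = Counter(ratings)
    total = len(ratings)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def filter_noisy_users(user_ratings, threshold=0.5):
    """Keep only users whose rating entropy reaches the threshold."""
    return {u: r for u, r in user_ratings.items() if user_entropy(r) >= threshold}

data = {"alice": [5, 5, 5, 5, 5],        # entropy 0.0 -> treated as noise
        "bob":   [1, 3, 4, 5, 2, 4, 3]}  # varied ratings -> kept
print(list(filter_noisy_users(data)))     # ['bob']
```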
Financial failure prediction using truncated Hinge loss support vector machine with smoothly clipped absolute deviation penalty
LIU Zunxiong, HUANG Zhiqiang, LIU Jiangwei, CHEN Ying
Journal of Computer Applications    2014, 34 (3): 873-878.   DOI: 10.11772/j.issn.1001-9081.2014.03.0873
Aiming at the problems that the traditional Support Vector Machine (SVM) classifier is sensitive to outliers, requires a large number of Support Vectors (SV), and yields a non-sparse separating hyperplane parameter, the Truncated hinge loss SVM with Smoothly Clipped Absolute Deviation (SCAD) penalty (SCAD-TSVM) was put forward and used to construct a financial early-warning model. At the same time, an iterative updating algorithm was proposed to solve the SCAD-TSVM model. Experiments were implemented on the financial data of A-share manufacturing companies listed on the Shanghai and Shenzhen stock markets. Compared with the T-2 and T-3 models constructed by SVM with L1-norm penalty (L1-SVM), SVM with SCAD penalty (SCAD-SVM) and Truncated hinge loss SVM (TSVM), the T-2 and T-3 models constructed by SCAD-TSVM had the best sparseness and the highest prediction accuracy, and their average prediction accuracies with different numbers of training samples were higher than those of the L1-SVM, SCAD-SVM and TSVM algorithms.
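For reference, the two building blocks named in the abstract can be written down directly; the sketch below gives the truncated hinge loss (in Wu and Liu's form) and the SCAD penalty (Fan and Li's form with the usual a = 3.7). The paper's iterative updating algorithm for combining them is not reproduced here.

```python
# Truncated hinge loss and SCAD penalty, written as plain scalar functions.
def hinge(u, s=1.0):
    """H_s(u) = max(0, s - u)."""
    return max(0.0, s - u)

def truncated_hinge(u, s=-1.0):
    """Truncated hinge loss T_s(u) = H_1(u) - H_s(u): bounded above, so
    outliers stop inflating the loss (and stop becoming support vectors)."""
    return hinge(u, 1.0) - hinge(u, s)

def scad_penalty(beta, lam=0.1, a=3.7):
    """SCAD penalty on one coefficient: L1-like near zero, constant for large
    |beta|, which gives sparse yet nearly unbiased estimates."""
    b = abs(beta)
    if b <= lam:
        return lam * b
    if b <= a * lam:
        return (2 * a * lam * b - b ** 2 - lam ** 2) / (2 * (a - 1))
    return (a + 1) * lam ** 2 / 2
```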
OMR image segmentation based on mutation signal detection
MA Lei, LIU Jiang, LI Xiao-peng, CHEN Xia
Journal of Computer Applications    2012, 32 (04): 1137-1140.   DOI: 10.3724/SP.J.1087.2012.01137
Concerning the accurate positioning of Optical Mark Recognition (OMR) images without any position information, an image segmentation approach based on mutation signal detection with the wavelet transform was proposed. Firstly, the horizontal and vertical projections were computed; these projection functions were then transformed by wavelet to detect mutation points, which reflect the boundaries of the OMR information well. The adaptability of the algorithm comes from the limited number of wavelet transforms and mutation signal detections it requires. The experimental results demonstrate that the method has high segmentation accuracy and stability, with a mean square error of segmentation accuracy of 0.4167 pixels. The method is efficient because segmentation uses only the horizontal and vertical projections, and it is insensitive to noise owing to the statistical characteristic of the projection functions and the multi-resolution characteristic of the wavelet transform.
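A rough sketch of the projection-plus-wavelet idea, assuming a binary image stored as a NumPy array and a Haar wavelet with a simple standard-deviation threshold (both assumptions, not the paper's choices): abrupt changes in the row and column projections are located through the detail coefficients of a single-level wavelet transform.

```python
# Projection profiles + single-level DWT detail coefficients to find boundaries.
import numpy as np
import pywt

def mutation_points(profile, wavelet="haar", k=3.0):
    """Indices where the projection changes abruptly (candidate boundaries)."""
    _, detail = pywt.dwt(profile.astype(float), wavelet)
    thresh = k * np.std(detail)
    # each detail coefficient covers two samples of the original profile
    return [2 * i for i, d in enumerate(detail) if abs(d) > thresh]

def segment_boundaries(binary_image):
    """binary_image: 2-D array with 1 for ink pixels and 0 for background."""
    h_profile = binary_image.sum(axis=1)   # row-wise projection
    v_profile = binary_image.sum(axis=0)   # column-wise projection
    return mutation_points(h_profile), mutation_points(v_profile)
```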
Zero watermark algorithm for binary document images based on texture spectrum
CHEN Xia, WANG Xi-chang, ZHANG Hua-ying, LIU Jiang
Journal of Computer Applications    2011, 31 (09): 2378-2381.   DOI: 10.3724/SP.J.1087.2011.02378
Concerning the copyright protection of binary document images, a zero watermark algorithm was proposed. The algorithm constructed a texture image based on the Local Binary Pattern (LBP), and the zero watermark information was then constructed from the texture spectrum histogram of that texture image. Compared with other text image watermarking methods, this method has better invisibility, and the original image information is not changed. Watermark attacks including image cropping, noise addition and rotation were tested. The experimental results show that the proposed zero watermark algorithm is robust: these attacks have little impact on the zero watermark information, and the algorithm is stable, with the lowest correlation value remaining above 0.85.
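The construction path described above can be sketched as follows, assuming 8-neighbour LBP codes and a median threshold over the 256-bin texture spectrum histogram (the thresholding rule is an illustrative assumption, not the paper's):

```python
# LBP texture spectrum histogram -> zero-watermark bit string (illustrative).
import numpy as np

def lbp_image(img):
    """8-neighbour LBP codes for the interior pixels of a grayscale or binary image."""
    img = np.asarray(img)
    center = img[1:-1, 1:-1]
    code = np.zeros(center.shape, dtype=np.int32)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy: img.shape[0] - 1 + dy, 1 + dx: img.shape[1] - 1 + dx]
        code += (neighbour >= center).astype(np.int32) << bit
    return code

def zero_watermark(img):
    """256-bin texture spectrum histogram -> one watermark bit per bin."""
    hist = np.bincount(lbp_image(img).ravel(), minlength=256)
    return (hist > np.median(hist)).astype(np.uint8)
```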
Improved data hiding scheme based on modulus function
Kai-hui LIU, Jiang-feng XU
Journal of Computer Applications    2011, 31 (07): 1917-1919.   DOI: 10.3724/SP.J.1087.2011.01917
The method proposed by Lee et al. (LEE C F, CHEN H L. A novel data hiding scheme based on modulus function. The Journal of Systems and Software, 2010, 83(5): 832-843) is based on the modulus function. In that method, each pixel can carry a maximum of 4 bits with acceptable visual quality; however, when each pixel is embedded with 4 bits, the quality of the stego-images degrades noticeably, which may attract the attention of attackers. Consequently, this paper improved the method by narrowing the range within which pixels are changed. Theoretical analysis and simulation results show that the new method not only keeps the advantages of the original method, but also increases the Peak Signal-to-Noise Ratio (PSNR) by 1.5~3.5 dB, thereby improving the imperceptibility of stego-images and their ability to resist RS attack.
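As a generic illustration of modulus-function embedding (not Lee et al.'s exact formulas or the improvement proposed here), the sketch below forces a pixel's value modulo 2^k to equal the k-bit secret digit while moving the pixel as little as possible:

```python
# Generic modulus-function embedding/extraction for one pixel (illustrative).
def embed_pixel(pixel, secret, k=2):
    """Return the stego pixel closest to `pixel` with stego % 2**k == secret."""
    m = 2 ** k
    up = pixel + ((secret - pixel) % m)     # nearest valid value at or above pixel
    candidates = [up, up - m]               # ... and the valid value just below it
    valid = [c for c in candidates if 0 <= c <= 255]
    return min(valid, key=lambda c: abs(c - pixel))

def extract_pixel(stego_pixel, k=2):
    """Recover the k-bit secret digit from a stego pixel."""
    return stego_pixel % (2 ** k)

p = embed_pixel(130, secret=3, k=2)
print(p, extract_pixel(p, k=2))  # 131 3
```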
Study on auto-proofreading method for POS tagging of Chinese corpus
ZHANG Hu, ZHENG Jia-heng, LIU Jiang
Journal of Computer Applications    2005, 25 (01): 17-19.   DOI: 10.3724/SP.J.2005.00017
The auto-proofreading problem in large-scale corpora was analyzed, and a new method for checking the correctness of Part-Of-Speech (POS) tagging, an auto-proofreading method based on clustering and classification, was put forward. Using clustering and classification, the method first classified the POS sequences of the training examples and obtained a threshold value. Then, according to the threshold value, it classified the test sequences to judge their correctness and suggested a proofread POS tag for each wrong tag. The method improves the correctness ratio of POS tagging on large-scale corpora.
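A very rough sketch of the clustering-and-thresholding idea, with an assumed positional-overlap similarity and data layout (neither is specified by the abstract): a test POS sequence is flagged as a likely tagging error when its best similarity to the reference clusters falls below a learned threshold.

```python
# Threshold learned from reference clusters of POS sequences, then used to judge test sequences.
def similarity(seq, reference):
    """Positional-overlap similarity between two equal-length POS sequences."""
    return sum(a == b for a, b in zip(seq, reference)) / len(seq)

def learn_threshold(clusters, margin=0.1):
    """Threshold = lowest within-cluster similarity, minus a safety margin."""
    sims = [similarity(s, c["center"]) for c in clusters for s in c["members"]]
    return max(0.0, min(sims) - margin)

def judge(test_seq, clusters, threshold):
    """Label a test POS sequence as probably correct or as a tagging suspect."""
    best = max(similarity(test_seq, c["center"]) for c in clusters)
    return "correct" if best >= threshold else "suspect"
```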